Exploration server: Health and musical practice

Warning: this site is under development!
Warning: this site was generated automatically from raw corpora.
The information has therefore not been validated.

Quantifying auditory temporal stability in a large database of recorded music.

Internal identifier: 000F57 (Main/Exploration); previous: 000F56; next: 000F58

Quantifying auditory temporal stability in a large database of recorded music.

Authors: Robert J. Ellis [Singapore]; Zhiyan Duan [Singapore]; Ye Wang [Singapore]

Source: PloS one, 2014; 9 (12): e110452

RBID: pubmed:25469636

French descriptors: Algorithmes ; Bases de données factuelles ; Humains ; Musique ; Ouïe ; Perception auditive (physiologie)

English descriptors: Algorithms ; Auditory Perception (physiology) ; Databases, Factual ; Hearing ; Humans ; Music

Abstract

"Moving to the beat" is both one of the most basic and one of the most profound means by which humans (and a few other species) interact with music. Computer algorithms that detect the precise temporal location of beats (i.e., pulses of musical "energy") in recorded music have important practical applications, such as the creation of playlists with a particular tempo for rehabilitation (e.g., rhythmic gait training), exercise (e.g., jogging), or entertainment (e.g., continuous dance mixes). Although several such algorithms return simple point estimates of an audio file's temporal structure (e.g., "average tempo", "time signature"), none has sought to quantify the temporal stability of a series of detected beats. Such a method--a "Balanced Evaluation of Auditory Temporal Stability" (BEATS)--is proposed here, and is illustrated using the Million Song Dataset (a collection of audio features and music metadata for nearly one million audio files). A publically accessible web interface is also presented, which combines the thresholdable statistics of BEATS with queryable metadata terms, fostering potential avenues of research and facilitating the creation of highly personalized music playlists for clinical or recreational applications.

DOI: 10.1371/journal.pone.0110452
PubMed: 25469636
PubMed Central: PMC4254286


Affiliations: School of Computing, National University of Singapore, Singapore, Singapore (all authors).




The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Quantifying auditory temporal stability in a large database of recorded music.</title>
<author>
<name sortKey="Ellis, Robert J" sort="Ellis, Robert J" uniqKey="Ellis R" first="Robert J" last="Ellis">Robert J. Ellis</name>
<affiliation wicri:level="4">
<nlm:affiliation>School of Computing, National University of Singapore, Singapore, Singapore.</nlm:affiliation>
<country xml:lang="fr">Singapour</country>
<wicri:regionArea>School of Computing, National University of Singapore, Singapore</wicri:regionArea>
<orgName type="university">Université nationale de Singapour</orgName>
</affiliation>
</author>
<author>
<name sortKey="Duan, Zhiyan" sort="Duan, Zhiyan" uniqKey="Duan Z" first="Zhiyan" last="Duan">Zhiyan Duan</name>
<affiliation wicri:level="4">
<nlm:affiliation>School of Computing, National University of Singapore, Singapore, Singapore.</nlm:affiliation>
<country xml:lang="fr">Singapour</country>
<wicri:regionArea>School of Computing, National University of Singapore, Singapore</wicri:regionArea>
<orgName type="university">Université nationale de Singapour</orgName>
</affiliation>
</author>
<author>
<name sortKey="Wang, Ye" sort="Wang, Ye" uniqKey="Wang Y" first="Ye" last="Wang">Ye Wang</name>
<affiliation wicri:level="4">
<nlm:affiliation>School of Computing, National University of Singapore, Singapore, Singapore.</nlm:affiliation>
<country xml:lang="fr">Singapour</country>
<wicri:regionArea>School of Computing, National University of Singapore, Singapore</wicri:regionArea>
<orgName type="university">Université nationale de Singapour</orgName>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="2014">2014</date>
<idno type="RBID">pubmed:25469636</idno>
<idno type="pmid">25469636</idno>
<idno type="doi">10.1371/journal.pone.0110452</idno>
<idno type="pmc">PMC4254286</idno>
<idno type="wicri:Area/Main/Corpus">000E73</idno>
<idno type="wicri:explorRef" wicri:stream="Main" wicri:step="Corpus" wicri:corpus="PubMed">000E73</idno>
<idno type="wicri:Area/Main/Curation">000E73</idno>
<idno type="wicri:explorRef" wicri:stream="Main" wicri:step="Curation">000E73</idno>
<idno type="wicri:Area/Main/Exploration">000E73</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Quantifying auditory temporal stability in a large database of recorded music.</title>
<author>
<name sortKey="Ellis, Robert J" sort="Ellis, Robert J" uniqKey="Ellis R" first="Robert J" last="Ellis">Robert J. Ellis</name>
<affiliation wicri:level="4">
<nlm:affiliation>School of Computing, National University of Singapore, Singapore, Singapore.</nlm:affiliation>
<country xml:lang="fr">Singapour</country>
<wicri:regionArea>School of Computing, National University of Singapore, Singapore</wicri:regionArea>
<orgName type="university">Université nationale de Singapour</orgName>
</affiliation>
</author>
<author>
<name sortKey="Duan, Zhiyan" sort="Duan, Zhiyan" uniqKey="Duan Z" first="Zhiyan" last="Duan">Zhiyan Duan</name>
<affiliation wicri:level="4">
<nlm:affiliation>School of Computing, National University of Singapore, Singapore, Singapore.</nlm:affiliation>
<country xml:lang="fr">Singapour</country>
<wicri:regionArea>School of Computing, National University of Singapore, Singapore</wicri:regionArea>
<orgName type="university">Université nationale de Singapour</orgName>
</affiliation>
</author>
<author>
<name sortKey="Wang, Ye" sort="Wang, Ye" uniqKey="Wang Y" first="Ye" last="Wang">Ye Wang</name>
<affiliation wicri:level="4">
<nlm:affiliation>School of Computing, National University of Singapore, Singapore, Singapore.</nlm:affiliation>
<country xml:lang="fr">Singapour</country>
<wicri:regionArea>School of Computing, National University of Singapore, Singapore</wicri:regionArea>
<orgName type="university">Université nationale de Singapour</orgName>
</affiliation>
</author>
</analytic>
<series>
<title level="j">PloS one</title>
<idno type="eISSN">1932-6203</idno>
<imprint>
<date when="2014" type="published">2014</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Algorithms (MeSH)</term>
<term>Auditory Perception (physiology)</term>
<term>Databases, Factual (MeSH)</term>
<term>Hearing (MeSH)</term>
<term>Humans (MeSH)</term>
<term>Music (MeSH)</term>
</keywords>
<keywords scheme="KwdFr" xml:lang="fr">
<term>Algorithmes (MeSH)</term>
<term>Bases de données factuelles (MeSH)</term>
<term>Humains (MeSH)</term>
<term>Musique (MeSH)</term>
<term>Ouïe (MeSH)</term>
<term>Perception auditive (physiologie)</term>
</keywords>
<keywords scheme="MESH" qualifier="physiologie" xml:lang="fr">
<term>Perception auditive</term>
</keywords>
<keywords scheme="MESH" qualifier="physiology" xml:lang="en">
<term>Auditory Perception</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Algorithms</term>
<term>Databases, Factual</term>
<term>Hearing</term>
<term>Humans</term>
<term>Music</term>
</keywords>
<keywords scheme="MESH" xml:lang="fr">
<term>Algorithmes</term>
<term>Bases de données factuelles</term>
<term>Humains</term>
<term>Musique</term>
<term>Ouïe</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">"Moving to the beat" is both one of the most basic and one of the most profound means by which humans (and a few other species) interact with music. Computer algorithms that detect the precise temporal location of beats (i.e., pulses of musical "energy") in recorded music have important practical applications, such as the creation of playlists with a particular tempo for rehabilitation (e.g., rhythmic gait training), exercise (e.g., jogging), or entertainment (e.g., continuous dance mixes). Although several such algorithms return simple point estimates of an audio file's temporal structure (e.g., "average tempo", "time signature"), none has sought to quantify the temporal stability of a series of detected beats. Such a method--a "Balanced Evaluation of Auditory Temporal Stability" (BEATS)--is proposed here, and is illustrated using the Million Song Dataset (a collection of audio features and music metadata for nearly one million audio files). A publically accessible web interface is also presented, which combines the thresholdable statistics of BEATS with queryable metadata terms, fostering potential avenues of research and facilitating the creation of highly personalized music playlists for clinical or recreational applications.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Status="MEDLINE" Owner="NLM">
<PMID Version="1">25469636</PMID>
<DateCompleted>
<Year>2015</Year>
<Month>08</Month>
<Day>11</Day>
</DateCompleted>
<DateRevised>
<Year>2019</Year>
<Month>02</Month>
<Day>23</Day>
</DateRevised>
<Article PubModel="Electronic-eCollection">
<Journal>
<ISSN IssnType="Electronic">1932-6203</ISSN>
<JournalIssue CitedMedium="Internet">
<Volume>9</Volume>
<Issue>12</Issue>
<PubDate>
<Year>2014</Year>
</PubDate>
</JournalIssue>
<Title>PloS one</Title>
<ISOAbbreviation>PLoS One</ISOAbbreviation>
</Journal>
<ArticleTitle>Quantifying auditory temporal stability in a large database of recorded music.</ArticleTitle>
<Pagination>
<MedlinePgn>e110452</MedlinePgn>
</Pagination>
<ELocationID EIdType="doi" ValidYN="Y">10.1371/journal.pone.0110452</ELocationID>
<Abstract>
<AbstractText>"Moving to the beat" is both one of the most basic and one of the most profound means by which humans (and a few other species) interact with music. Computer algorithms that detect the precise temporal location of beats (i.e., pulses of musical "energy") in recorded music have important practical applications, such as the creation of playlists with a particular tempo for rehabilitation (e.g., rhythmic gait training), exercise (e.g., jogging), or entertainment (e.g., continuous dance mixes). Although several such algorithms return simple point estimates of an audio file's temporal structure (e.g., "average tempo", "time signature"), none has sought to quantify the temporal stability of a series of detected beats. Such a method--a "Balanced Evaluation of Auditory Temporal Stability" (BEATS)--is proposed here, and is illustrated using the Million Song Dataset (a collection of audio features and music metadata for nearly one million audio files). A publically accessible web interface is also presented, which combines the thresholdable statistics of BEATS with queryable metadata terms, fostering potential avenues of research and facilitating the creation of highly personalized music playlists for clinical or recreational applications.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Ellis</LastName>
<ForeName>Robert J</ForeName>
<Initials>RJ</Initials>
<AffiliationInfo>
<Affiliation>School of Computing, National University of Singapore, Singapore, Singapore.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Duan</LastName>
<ForeName>Zhiyan</ForeName>
<Initials>Z</Initials>
<AffiliationInfo>
<Affiliation>School of Computing, National University of Singapore, Singapore, Singapore.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Wang</LastName>
<ForeName>Ye</ForeName>
<Initials>Y</Initials>
<AffiliationInfo>
<Affiliation>School of Computing, National University of Singapore, Singapore, Singapore.</Affiliation>
</AffiliationInfo>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
<PublicationType UI="D013485">Research Support, Non-U.S. Gov't</PublicationType>
</PublicationTypeList>
<ArticleDate DateType="Electronic">
<Year>2014</Year>
<Month>12</Month>
<Day>03</Day>
</ArticleDate>
</Article>
<MedlineJournalInfo>
<Country>United States</Country>
<MedlineTA>PLoS One</MedlineTA>
<NlmUniqueID>101285081</NlmUniqueID>
<ISSNLinking>1932-6203</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>IM</CitationSubset>
<MeshHeadingList>
<MeshHeading>
<DescriptorName UI="D000465" MajorTopicYN="Y">Algorithms</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D001307" MajorTopicYN="N">Auditory Perception</DescriptorName>
<QualifierName UI="Q000502" MajorTopicYN="Y">physiology</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D016208" MajorTopicYN="N">Databases, Factual</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D006309" MajorTopicYN="N">Hearing</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D006801" MajorTopicYN="N">Humans</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D009146" MajorTopicYN="Y">Music</DescriptorName>
</MeshHeading>
</MeshHeadingList>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="received">
<Year>2014</Year>
<Month>07</Month>
<Day>17</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="accepted">
<Year>2014</Year>
<Month>09</Month>
<Day>10</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez">
<Year>2014</Year>
<Month>12</Month>
<Day>4</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed">
<Year>2014</Year>
<Month>12</Month>
<Day>4</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2015</Year>
<Month>8</Month>
<Day>12</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>epublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="pubmed">25469636</ArticleId>
<ArticleId IdType="doi">10.1371/journal.pone.0110452</ArticleId>
<ArticleId IdType="pii">PONE-D-14-32051</ArticleId>
<ArticleId IdType="pmc">PMC4254286</ArticleId>
</ArticleIdList>
<ReferenceList>
<Reference>
<Citation>Percept Psychophys. 1993 Sep;54(3):277-86</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">8414886</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Clin Rehabil. 2005 Oct;19(7):695-713</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">16250189</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Hum Mov Sci. 2007 Aug;26(4):555-89</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">17618701</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>J Exp Psychol Gen. 2012 Feb;141(1):54-75</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">21767048</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Parkinsons Dis. 2010 Jul 13;2010:483530</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">20976086</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Int J Geriatr Psychiatry. 2013 Sep;28(9):914-24</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">23225749</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Brain. 1994 Oct;117 ( Pt 5):1169-81</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">7953597</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Arch Phys Med Rehabil. 2001 Aug;82(8):1050-6</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">11494184</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>J Exp Psychol Hum Percept Perform. 1997 Jun;23(3):808-22</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">9180045</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>J Exp Psychol Hum Percept Perform. 1982 Feb;8(1):46-57</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">6460084</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Atten Percept Psychophys. 2010 Nov;72(8):2274-88</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">21097869</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Psychol Res. 1978 Oct 5;40(2):173-81</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">693733</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Int Rev Sport Exerc Psychol. 2012 Mar;5(1):44-66</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">22577472</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Percept Psychophys. 2007 Jul;69(5):709-18</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">17929694</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Psychol Rev. 1989 Jul;96(3):459-91</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">2756068</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Mov Disord. 2008 Jul 30;23(10):1446-52</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">18512747</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Int Rev Sport Exerc Psychol. 2012 Mar;5(1):67-84</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">22577473</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Brain Cogn. 2005 Jun;58(1):133-47</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">15878734</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Psychol Bull. 2001 Jan;127(1):22-44</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">11271754</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Proc Natl Acad Sci U S A. 2009 Feb 17;106(7):2468-71</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">19171894</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>J Acoust Soc Am. 2012 May;131(5):4013-22</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">22559374</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Proc Natl Acad Sci U S A. 2010 Mar 30;107(13):5768-73</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">20231438</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Percept Psychophys. 1989 Apr;45(4):291-6</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">2710629</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Int J Rehabil Res. 2008 Jun;31(2):155-7</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">18467930</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Mov Disord. 2004 Aug;19(8):871-84</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">15300651</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Psychon Bull Rev. 2013 Jun;20(3):403-52</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">23397235</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Mov Disord. 2007 Mar 15;22(4):451-60; quiz 600</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">17133526</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>J Neurol Sci. 2003 Aug 15;212(1-2):47-53</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">12809998</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Atten Percept Psychophys. 2010 Apr;72(3):561-82</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">20348562</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Ann N Y Acad Sci. 2012 Apr;1252:85-91</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">22524344</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Int J Psychophysiol. 1995 Apr;19(3):193-201</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">7558986</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Cogn Psychol. 2000 Nov;41(3):254-311</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">11032658</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>J Sports Med Phys Fitness. 1991 Mar;31(1):100-3</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">1861474</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Front Syst Neurosci. 2014 May 13;8:57</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">24860439</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>J Neuroeng Rehabil. 2005 Jul 20;2:19</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">16033650</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Psychosom Med. 2000 May-Jun;62(3):386-93</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">10845352</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Mov Disord. 1996 Mar;11(2):193-200</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">8684391</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Mov Disord. 1999 Sep;14(5):808-19</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">10495043</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>J Music Ther. 1998;35(4):228-241</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">10519837</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Cortex. 2009 Jan;45(1):4-17</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">19046745</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Front Aging Neurosci. 2010 Jul 21;2:null</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">20725636</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Percept Psychophys. 2005 Oct;67(7):1150-60</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">16502837</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Parkinsonism Relat Disord. 2012 Jan;18 Suppl 1:S114-9</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">22166406</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>J Music Ther. 2001 Summer;38(2):82-96</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">11469917</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Mov Disord. 2002 Nov;17(6):1148-60</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">12465051</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>J Am Geriatr Soc. 2006 Aug;54(8):1241-4</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">16913992</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Osteoporos Int. 2010 Aug;21(8):1295-306</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">20195846</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Chaos. 2009 Jun;19(2):026113</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">19566273</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>J Acoust Soc Am. 2008 Dec;124(6):4024-41</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">19206825</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Arch Phys Med Rehabil. 2013 Mar;94(3):562-70</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">23127307</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Disabil Rehabil. 2013 Jan;35(2):164-76</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">22681598</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>J Sports Sci. 2012 May;30(9):953-6</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">22512537</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Neurorehabil Neural Repair. 2007 Sep-Oct;21(5):455-9</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">17426347</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>PLoS One. 2009;4(10):e7487</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">19834599</ArticleId>
</ArticleIdList>
</Reference>
<Reference>
<Citation>Psychol Sci. 2001 May;12(3):248-51</Citation>
<ArticleIdList>
<ArticleId IdType="pubmed">11437309</ArticleId>
</ArticleIdList>
</Reference>
</ReferenceList>
</PubmedData>
</pubmed>
<affiliations>
<list>
<country>
<li>Singapour</li>
</country>
<orgName>
<li>Université nationale de Singapour</li>
</orgName>
</list>
<tree>
<country name="Singapour">
<noRegion>
<name sortKey="Ellis, Robert J" sort="Ellis, Robert J" uniqKey="Ellis R" first="Robert J" last="Ellis">Robert J. Ellis</name>
</noRegion>
<name sortKey="Duan, Zhiyan" sort="Duan, Zhiyan" uniqKey="Duan Z" first="Zhiyan" last="Duan">Zhiyan Duan</name>
<name sortKey="Wang, Ye" sort="Wang, Ye" uniqKey="Wang Y" first="Ye" last="Wang">Ye Wang</name>
</country>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Sante/explor/SanteMusiqueV1/Data/Main/Exploration
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000F57 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd -nk 000F57 | SxmlIndent | more
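
As a minimal usage sketch (assuming a local Dilib installation in which $WICRI_ROOT points to the root of the Wicri data tree; the path below is only an example), the same selection can be written to a file for later processing:

export WICRI_ROOT=/opt/wicri    # hypothetical installation path, adjust to your setup
EXPLOR_STEP=$WICRI_ROOT/Sante/explor/SanteMusiqueV1/Data/Main/Exploration
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000F57 | SxmlIndent > 000F57.xml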

To link to this page within the Wicri network

{{Explor lien
   |wiki=    Sante
   |area=    SanteMusiqueV1
   |flux=    Main
   |étape=   Exploration
   |type=    RBID
   |clé=     pubmed:25469636
   |texte=   Quantifying auditory temporal stability in a large database of recorded music.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Main/Exploration/RBID.i   -Sk "pubmed:25469636" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd   \
       | NlmPubMed2Wicri -a SanteMusiqueV1 
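
To inspect the record selected through the RBID index before generating the wiki page, the same selection can be piped into SxmlIndent instead of NlmPubMed2Wicri (a usage sketch built only from the commands shown above):

HfdIndexSelect -h $EXPLOR_AREA/Data/Main/Exploration/RBID.i   -Sk "pubmed:25469636" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd   \
       | SxmlIndent | more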


This area was generated with Dilib version V0.6.38.
Data generation: Mon Mar 8 15:23:44 2021. Site generation: Mon Mar 8 15:23:58 2021